Low-rank representation subspace clustering method based on Hessian regularization and non-negative constraint
Lili FAN, Guifu LU, Ganyi TANG, Dan YANG
Journal of Computer Applications    2022, 42 (1): 115-122.   DOI: 10.11772/j.issn.1001-9081.2021071181

To address the issue that the Low-Rank Representation (LRR) subspace clustering algorithm ignores the local structure of the data and may therefore lose locally similar information during learning, a Low-Rank Representation subspace clustering algorithm based on Hessian regularization and Non-negative constraint (LRR-HN) was proposed to exploit both the global and local structure of the data. Firstly, the good extrapolating capability of Hessian regularization was used to preserve the local manifold structure of the data, making the local topological structure of the data more expressive. Secondly, since the learned coefficient matrix often contains both positive and negative values, and the negative values usually have no practical meaning, non-negative constraints were introduced to guarantee a valid model solution and a more meaningful description of the local structure of the data. Finally, the low-rank representation capturing the global structure of the data was obtained by minimizing the nuclear norm, so that high-dimensional data could be clustered better. In addition, an efficient solver for LRR-HN was designed using the linearized alternating direction method with adaptive penalty, and the proposed algorithm was evaluated by ACcuracy (AC) and Normalized Mutual Information (NMI) on several real-world datasets. In experiments with 20 clusters on the ORL dataset, LRR-HN improves AC and NMI by 11% and 9.74% respectively compared with the LRR algorithm, and by 5% and 1.05% respectively compared with the Adaptive Low-Rank Representation (ALRR) algorithm. Experimental results show that LRR-HN substantially outperforms several existing algorithms in AC and NMI and achieves excellent clustering performance.
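As a rough illustration of two building blocks that appear in solvers of this kind (a sketch only, not the paper's exact LRR-HN algorithm or parameter choices), the proximal step for nuclear-norm minimization is singular value thresholding, and the non-negative constraint corresponds to projection onto the non-negative orthant:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of tau * nuclear norm.
    Shrinks each singular value of M by tau, which is the key step when
    minimizing the nuclear norm inside an alternating direction method."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_thr = np.maximum(s - tau, 0.0)          # soft-threshold singular values
    return (U * s_thr) @ Vt                   # rebuild the low-rank matrix

def nonneg_project(Z):
    """Projection onto the non-negative orthant (the non-negative constraint)."""
    return np.maximum(Z, 0.0)

# Toy usage: one thresholding step followed by the non-negative projection.
M = np.arange(9, dtype=float).reshape(3, 3)
Z = nonneg_project(svt(M, 1.0))
```

In a full solver these two operations would alternate with data-fidelity and Hessian-regularization updates; here they only show the shape of the individual steps.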

Manifold regularized nonnegative matrix factorization based on clean data
Hua LI, Guifu LU, Qinru YU
Journal of Computer Applications    2021, 41 (12): 3492-3498.   DOI: 10.11772/j.issn.1001-9081.2021060962

Existing Nonnegative Matrix Factorization (NMF) algorithms are usually designed based on the Euclidean distance, which makes them sensitive to noise. To enhance their robustness, a Manifold Regularized Nonnegative Matrix Factorization based on Clean Data (MRNMF/CD) algorithm was proposed, in which low-rank constraints, manifold regularization and NMF were seamlessly integrated, giving the algorithm relatively excellent performance. Firstly, by adding low-rank constraints, MRNMF/CD can recover clean data from noisy data and capture the global structure of the data. Secondly, to exploit the local geometric structure of the data, manifold regularization was incorporated into the objective function of MRNMF/CD. In addition, an iterative algorithm for solving MRNMF/CD was proposed, and the convergence of this solver was analyzed theoretically. Experimental results on the ORL, Yale and COIL20 datasets show that the MRNMF/CD algorithm achieves better accuracy than existing algorithms including k-means, Principal Component Analysis (PCA), NMF and Graph Regularized Nonnegative Matrix Factorization (GNMF).
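To make the manifold-regularization idea concrete, the classical GNMF baseline cited above can be sketched with multiplicative updates (a minimal sketch of GNMF, not the authors' MRNMF/CD, which additionally recovers clean data via low-rank constraints; the affinity matrix `W`, rank `k`, and `lam` are illustrative choices):

```python
import numpy as np

def gnmf(X, W, k, lam=0.1, iters=200, seed=0):
    """Graph (manifold) regularized NMF via multiplicative updates.

    Minimizes ||X - U V^T||_F^2 + lam * Tr(V^T L V), where L = D - W is the
    graph Laplacian built from the sample affinity matrix W.
    X: non-negative data, shape (m, n), columns are samples.
    W: symmetric non-negative affinity between samples, shape (n, n).
    Returns non-negative factors U (m, k) and V (n, k).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    D = np.diag(W.sum(axis=1))                # degree matrix of the graph
    U = rng.random((m, k)) + 1e-3             # positive initialization
    V = rng.random((n, k)) + 1e-3
    eps = 1e-10                               # avoid division by zero
    for _ in range(iters):
        # Standard GNMF multiplicative updates; factors stay non-negative.
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V
```

The `lam * W @ V` term in the numerator pulls the representations of strongly connected samples together, which is exactly how manifold regularization injects local geometric structure into the factorization.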
